Multilinear Spectral Regularization for Kernel-based Multitask Learning

Author

  • Marco Signoretto
Abstract

Recent research in machine learning has witnessed a renewed interest in tensors. In particular, multilinear algebra has been leveraged to derive structured finite-dimensional parametric models [1, 2]. In [3] these ideas were generalized to reproducing kernel Hilbert spaces. The resulting framework comprises existing problem formulations, such as tensor completion [4], as well as novel functional formulations. The approach is based on a class of regularizers for tensor product functions, termed multilinear spectral penalties, which is related to spectral regularization for operator estimation [5]. In this work we outline the main ideas and focus on the implications for (multilinear) multitask learning. Multi-task learning (MTL) aims at simultaneously finding multiple predictive models, each of which corresponds to a learning task. In many cases of interest MTL has been shown to improve over learning the tasks in isolation; see [6] and references therein. Importantly, the approach allows one to make predictions even in the absence of training data for one or more of the tasks, and is therefore suitable for transfer learning [7]. Recently, [2] proposed an extension, termed multilinear multi-task learning (MLMTL), to account for multi-modal interactions between the tasks. This is a departure from classical tensor-based methods, where the multilinear decomposition is performed on the input data.
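As a rough illustration of the kind of regularizer involved (a minimal sketch assuming NumPy, not the authors' implementation), one common multilinear spectral penalty is the sum of nuclear norms of the mode unfoldings of a weight tensor, which encourages low multilinear rank and hence sharing of information across task modes:

```python
import numpy as np

def unfold(tensor, mode):
    """Mode-n unfolding: move axis `mode` to the front and flatten the rest."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def multilinear_spectral_penalty(W):
    """Sum of nuclear norms of all mode unfoldings (the overlapped trace norm),
    a standard convex surrogate for low multilinear rank."""
    return sum(np.linalg.norm(unfold(W, m), ord='nuc') for m in range(W.ndim))

# Example: a weight tensor for tasks indexed along two modes,
# e.g. axes (feature, task-index-1, task-index-2).
rng = np.random.default_rng(0)
W = rng.standard_normal((5, 3, 4))
penalty = multilinear_spectral_penalty(W)
```

For a rank-1 tensor every unfolding has rank 1, so each nuclear norm equals the tensor's Frobenius norm and the penalty reduces to `W.ndim` times that norm; for generic tensors the penalty is strictly larger, which is what drives the solution toward low multilinear rank.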


Similar resources

Learning Tensors in Reproducing Kernel Hilbert Spaces with Multilinear Spectral Penalties

We present a general framework to learn functions in tensor product reproducing kernel Hilbert spaces (TP-RKHSs). The methodology is based on a novel representer theorem suitable for existing as well as new spectral penalties for tensors. When the functions in the TP-RKHS are defined on the Cartesian product of finite discrete sets, in particular, our main problem formulation admits as a specia...


Multi-Task Multiple Kernel Relationship Learning

This paper presents a novel multitask multiple kernel learning framework that efficiently learns the kernel weights leveraging the relationship across multiple tasks. The idea is to automatically infer this task relationship in the RKHS space corresponding to the given base kernels. The problem is formulated as a regularization-based approach called MultiTask Multiple Kernel Relationship Learni...
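As a minimal sketch of one ingredient of such frameworks (illustrative only; the base kernels and weights below are assumptions, not the paper's learned quantities), a conic combination of base kernel matrices with nonnegative weights is again a valid (positive semidefinite) kernel:

```python
import numpy as np

def linear_kernel(X):
    """Gram matrix of the linear kernel k(x, y) = <x, y>."""
    return X @ X.T

def rbf_kernel(X, gamma=1.0):
    """Gram matrix of the Gaussian RBF kernel exp(-gamma * ||x - y||^2)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-gamma * d2)

def combined_kernel(X, weights, gamma=1.0):
    """Conic combination of base kernels; nonnegative weights keep it PSD."""
    bases = [linear_kernel(X), rbf_kernel(X, gamma)]
    return sum(w * K for w, K in zip(weights, bases))

rng = np.random.default_rng(2)
X = rng.standard_normal((8, 3))
K = combined_kernel(X, weights=[0.7, 0.3])
```

In multiple kernel learning the weights themselves are optimized; here they are fixed purely for illustration.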


Excess risk bounds for multitask learning with trace norm regularization

Trace norm regularization is a popular method of multitask learning. We give excess risk bounds with explicit dependence on the number of tasks, the number of examples per task and properties of the data distribution. The bounds are independent of the dimension of the input space, which may be infinite as in the case of reproducing kernel Hilbert spaces. A byproduct of the proof are bounds on t...
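To make the mechanism concrete (a hedged sketch in a proximal-gradient setting, not the paper's proof technique), the proximal operator of the trace norm soft-thresholds singular values, which is what shrinks a tasks-as-columns weight matrix toward low rank:

```python
import numpy as np

def svt(W, tau):
    """Singular value thresholding: the proximal operator of tau * ||W||_*."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

# Columns of W are per-task weight vectors; thresholding the shared
# singular directions couples the tasks.
rng = np.random.default_rng(1)
W = rng.standard_normal((6, 4))
W_shrunk = svt(W, tau=0.5)
```

The nuclear norm of the result equals the sum of the soft-thresholded singular values, so small shared directions are zeroed out entirely.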


Multilinear Multitask Learning

Many real world datasets occur or can be arranged into multi-modal structures. With such datasets, the tasks to be learnt can be referenced by multiple indices. Current multitask learning frameworks are not designed to account for the preservation of this information. We propose the use of multilinear algebra as a natural way to model such a set of related tasks. We present two learning methods...


Multitask learning meets tensor factorization: task imputation via convex optimization

We study a multitask learning problem in which each task is parametrized by a weight vector and indexed by a pair of indices, which can be, e.g., (consumer, time). The weight vectors can be collected into a tensor, and the (multilinear) rank of the tensor controls the amount of sharing of information among tasks. Two types of convex relaxations have recently been proposed for the tensor multilinea...



Journal:

Volume:   Issue:

Pages:  -

Publication date: 2013